123 research outputs found

    Comment on "Consistency, amplitudes, and probabilities in quantum theory"

    In a recent article [Phys. Rev. A 57, 1572 (1998)] Caticha has concluded that ``nonlinear variants of quantum mechanics are inconsistent.'' In this note we identify what it is that nonlinear quantum theories have been shown to be inconsistent with.
    Comment: LaTeX, 5 pages, no figures

    Entropic Dynamics, Time and Quantum Theory

    Quantum mechanics is derived as an application of the method of maximum entropy. No appeal is made to any underlying classical action principle, whether deterministic or stochastic. Instead, the basic assumption is that in addition to the particles of interest x there exist extra variables y whose entropy S(x) depends on x. The Schrödinger equation follows from their coupled dynamics: the entropy S(x) drives the dynamics of the particles x while they in their turn determine the evolution of S(x). In this "entropic dynamics" time is introduced as a device to keep track of change. A welcome feature of such an entropic time is that it naturally incorporates an arrow of time. Both the magnitude and the phase of the wave function are given statistical interpretations: the magnitude gives the distribution of x in agreement with the usual Born rule and the phase carries information about the entropy S(x) of the extra variables. Extending the model to include external electromagnetic fields yields further insight into the nature of the quantum phase.
    Comment: 29 pages
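
The statistical reading of the wave function described in this abstract can be summarized schematically (a hedged sketch; the polar decomposition and the symbols rho and phi are assumed notation, not taken from the paper):

```latex
% Polar decomposition of the wave function (assumed notation):
\Psi(x,t) = \sqrt{\rho(x,t)}\, e^{i\phi(x,t)}
% Magnitude: the usual Born rule for the particle distribution,
|\Psi(x,t)|^2 = \rho(x,t)
% Phase: carries information about the entropy S(x) of the extra variables y,
\phi(x,t) \;\leftrightarrow\; S(x)
```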

    Opinion Dynamics of Learning Agents: Does Seeking Consensus Lead to Disagreement?

    We study opinion dynamics in a population of interacting adaptive agents voting on a set of complex multidimensional issues. Each agent classifies issues as for or against, arriving at its opinion on a given issue through an adaptive algorithm. Adaptation comes from learning, and the information for the learning process comes from interacting with neighboring agents and trying to change the internal state so as to concur with their opinions. The change in the internal state is driven by the information contained in the issue and in the opinion of the other agent. We present results in a simple yet rich context where each agent uses a Boolean perceptron to state its opinion. If there is no internal clock, so that updates occur through asynchronously exchanged information among pairs of agents, then the typical outcome, when the number of issues is kept small, is the evolution into a society torn by the emergence of factions with extreme opposite beliefs. This occurs even when agents seek consensus with those holding opposite opinions. The curious result is that it is learning from those who hold the same opinions that drives the emergence of factions: factions are prevented only when agents do not learn at all from those who share their opinion. If the number of issues is large, the dynamics becomes trapped, the society does not evolve into factions, and a distribution of moderate opinions is observed. We also study the less realistic, but technically simpler, synchronous case and show that global consensus is a fixed point. However, the approach to this consensus is glassy in the limit of large societies if agents adapt even in the case of agreement.
    Comment: 16 pages, 10 figures, revised version
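
The asynchronous setting described in this abstract can be sketched minimally as follows. This is an illustrative assumption, not the paper's specification: the learning rule below is a plain perceptron update applied on disagreement, and all parameters and the random seed are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N_AGENTS, DIM, N_ISSUES, STEPS, LR = 20, 5, 3, 2000, 0.05

weights = rng.standard_normal((N_AGENTS, DIM))   # agents' internal states
issues = rng.standard_normal((N_ISSUES, DIM))    # complex multidimensional issues

def opinion(w, z):
    """Boolean perceptron vote on issue z: +1 (for) or -1 (against)."""
    return 1.0 if w @ z >= 0.0 else -1.0

for _ in range(STEPS):
    # Asynchronous dynamics: a random pair of agents exchanges one opinion.
    i, j = rng.choice(N_AGENTS, size=2, replace=False)
    z = issues[rng.integers(N_ISSUES)]
    # Agent i nudges its internal state toward agent j's stated opinion.
    if opinion(weights[i], z) != opinion(weights[j], z):
        weights[i] += LR * opinion(weights[j], z) * z

# Opinion profile of the society on every issue after adaptation.
profile = np.array([[opinion(w, z) for z in issues] for w in weights])
```

Clustering the rows of `profile` would reveal whether the society has split into factions with opposite voting patterns, which is the quantity the paper tracks.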

    Bayesian Probabilities and the Histories Algebra

    We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
    Comment: 10 pages, accepted by Int. J. Theo. Phys. Feb 200
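
For orientation, the ordinary Bayes' theorem that the paper generalises to complete sets of history propositions reads (standard probability notation, not the paper's histories formalism):

```latex
P(h \mid d) \;=\; \frac{P(d \mid h)\, P(h)}{P(d)},
\qquad
P(d) \;=\; \sum_{h'} P(d \mid h')\, P(h')
```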

    Maximum Entropy and Bayesian Data Analysis: Entropic Priors

    The problem of assigning probability distributions which objectively reflect the prior information available about experiments is one of the major stumbling blocks in the use of Bayesian methods of data analysis. In this paper the method of Maximum (relative) Entropy (ME) is used to translate the information contained in the known form of the likelihood into a prior distribution for Bayesian inference. The argument is inspired and guided by intuition gained from the successful use of ME methods in statistical mechanics. For experiments that cannot be repeated the resulting "entropic prior" is formally identical with the Einstein fluctuation formula. For repeatable experiments, however, the expected value of the entropy of the likelihood turns out to be relevant information that must be included in the analysis. The important case of a Gaussian likelihood is treated in detail.
    Comment: 23 pages, 2 figures
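
The maximum-entropy step at the heart of this abstract can be illustrated on a textbook example (Jaynes' loaded-die problem, used here as an assumed illustration, not the paper's entropic-prior construction): maximizing entropy subject to a mean constraint yields an exponentially tilted distribution whose Lagrange multiplier can be found numerically.

```python
import numpy as np

# States 0..5 with a uniform prior measure; constrain the mean to 3.5.
# The ME solution is the exponential tilt p_k proportional to exp(lam * k);
# we solve for lam by bisection on the monotone map lam -> mean.
states = np.arange(6)
target_mean = 3.5

def mean_for(lam):
    w = np.exp(lam * states)
    p = w / w.sum()
    return p @ states

lo, hi = -5.0, 5.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
p = np.exp(lam * states)
p /= p.sum()          # the maximum-entropy distribution
```

Because the target mean 3.5 exceeds the uniform mean 2.5, the multiplier comes out positive and the resulting distribution is tilted toward the high states.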

    Is Tsallis thermodynamics nonextensive?

    Within the framework of Tsallis thermodynamics, and using scaling properties of the entropy, we derive a generalization of the Gibbs-Duhem equation. The analysis suggests a transformation of variables that allows standard thermodynamics to be recovered. Moreover, we generalize Einstein's formula for the probability of a fluctuation by means of the maximum statistical entropy method. The proposed transformation of variables also shows that fluctuations within Tsallis statistics can be mapped to those of standard statistical mechanics.
    Comment: 4 pages, no figures, revised version, new title, accepted in PR
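
The Tsallis entropy underlying this abstract is the standard one-parameter deformation of the Gibbs entropy; the formula below is textbook material, not specific to the paper (units with k_B = 1):

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1).

    The Boltzmann-Gibbs (Shannon) entropy is recovered in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))   # Gibbs limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = [0.5, 0.3, 0.2]
s_gibbs = tsallis_entropy(p, 1.0)
s_near = tsallis_entropy(p, 1.0 + 1e-6)   # approaches the Gibbs value
```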

    On The Complexity Of Statistical Models Admitting Correlations

    We compute the asymptotic temporal behavior of the dynamical complexity associated with the maximum-probability trajectories on Gaussian statistical manifolds in the presence of correlations between the variables labeling the macrostates of the system. The algorithmic structure of our asymptotic computations is presented, with special focus on the diagonalization procedure that simplifies the problem in a remarkable way. We observe a power-law decay of the information-geometric complexity at a rate determined by the correlation coefficient. We conclude that macro-correlations lead to the emergence of an asymptotic information-geometric compression of the statistical macrostates explored on the configuration manifold of the model in its evolution between the initial and final macrostates.
    Comment: 15 pages, no figures; improved version

    Information-Geometric Indicators of Chaos in Gaussian Models on Statistical Manifolds of Negative Ricci Curvature

    A new information-geometric approach to chaotic dynamics on curved statistical manifolds based on Entropic Dynamics (ED) is proposed. It is shown that the hyperbolicity of a non-maximally symmetric 6N-dimensional statistical manifold M_{s} underlying an ED Gaussian model describing an arbitrary system of 3N degrees of freedom leads to linear information-geometric entropy growth and to exponential divergence of the Jacobi vector field intensity, quantum and classical features of chaos, respectively.
    Comment: 8 pages, final version accepted for publication

    Microtubules: Montroll's kink and Morse vibrations

    Using a version of Witten's supersymmetric quantum mechanics proposed by Caticha, we relate Montroll's kink to a traveling, asymmetric Morse double-well potential, suggesting in this way a connection between kink modes and vibrational degrees of freedom along microtubules.
    Comment: 2pp, twocolumn

    Entropy Distance: New Quantum Phenomena

    We study a curve of Gibbsian families of complex 3x3 matrices and point out new features, absent in commutative finite-dimensional algebras: a discontinuous maximum-entropy inference, a discontinuous entropy distance, and non-exposed faces of the mean value set. We analyze these problems from various aspects, including convex geometry, topology, and information geometry. This research is motivated by a theory of info-max principles, to which we contribute by computing first-order optimality conditions of the entropy distance.
    Comment: 34 pages, 5 figures